LW06 correlations testing

29/12/20 v2 PH

Implemented covariance VMI analysis routines (includes filtering).

10/12/20 v1

Basic correlation coefficient for ion and electron channels (raw data, only gas on/off filtering).

Setup

In [1]:
import numpy as np

from h5py import File

from pathlib import Path

# HV imports
import holoviews as hv
from holoviews import opts
hv.extension('bokeh', 'matplotlib')

import xarray as xr


# Memory profiler - OPTIONAL for testing
# https://timothymonteath.com/articles/monitoring_memory_usage/
# %load_ext memory_profiler
# %memit


# Quick hack for slow internet connection!
%autosave 36000

# Also image stacks can be problematic
showImgStacks = False

# Import class - this requires pointing to the `tmoDataBase.py` file.
# See https://github.com/phockett/tmo-dev

# Import class from file
import sys
sys.path.append(r'/cds/home/p/phockett/dev/tmo-dev/tmo')
import tmoDataBase as tb
# from vmi import VMI  # VMI class - currently has multiple filtering, ish.
# from inversion import VMIproc  # VMI processing class
from correlations import corr  # Correlations class

tb.setPlotDefaults()  # Set plot defaults (optional)

Autosaving every 36000 seconds

Read data

In [2]:
# Set main data path
baseDir = Path('/reg/data/ana15/tmo/tmolw0618/scratch/preproc/v5')
keys = range(81,87+1)

# Setup class object and which runs to read.
# data = VMIproc(fileBase = baseDir, runList = range(81,87+1), fileSchema='_v5')
# data = VMIproc(fileBase = baseDir, runList = keys, fileSchema='_v5')  # Testing with single dataset
# data = VMI(fileBase = baseDir, runList = keys, fileSchema='_v5')  # Testing with single dataset
data = corr(fileBase = baseDir, runList = keys, fileSchema='_v5')  # Testing with multiple datasets

# The class has a bunch of dictionaries holding info on the data
# data.runs['files']

# Read in the files with h5py
# Very basic checking is currently implemented, just to see if 'energies' is present.
data.readFiles()
Read 7 files.
Good datasets: [81, 82, 83, 84, 85, 86, 87]
Invalid datasets: []

Filter

In [3]:
# Simple filter - set for gas on only
# TODO: implement multiple filter sets here.
# data.setFilter(reset = True)
data.setFilter({'gas on':{'evrs':[1,1,70]}}, reset = True)  # , 'gas off':{'evrs':[0,0,70]}})
data.filterData()

Calculate correlation matrices (per run)

In [4]:
# Run corrRun() to calculate covariance matrices for each run
data.corrRun()
Set self.ndoverlay, self.hmap and self.layout for dim=0.
Set self.ndoverlay, self.hmap and self.layout for dim=['intensities', 'eShot'].
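Conceptually, corrRun() reduces to numpy.corrcoef over the per-shot variables for each run; a minimal sketch with hypothetical data (not the class internals):

```python
import numpy as np

# Hypothetical per-shot variables for one run: 5 ion intensities + electron count.
rng = np.random.default_rng(1)
varsPerShot = rng.normal(size=(6, 1000))  # rows = variables, columns = shots

# Pearson correlation matrix across variables (6x6, symmetric, unit diagonal).
corrMat = np.corrcoef(varsPerShot)

print(corrMat.shape)  # (6, 6)
```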
In [5]:
data.hmap
WARNING:param.HeatMapPlot04584: Due to internal constraints, when aspect and width/height is set, the bokeh backend uses those values as frame_width/frame_height instead. This ensures the aspect is respected, but means that the plot might be slightly larger than anticipated. Set the frame_width/frame_height explicitly to suppress this warning.
Out[5]:
In [6]:
data.hmap.layout().cols(2)
Out[6]:

Some strong correlations with time; is there anything interesting here...?

TODO:

  • Bin assignments?
  • Multiple parameter filters (energies and such).
  • Correlate derived data, e.g. VMI.

Sanity check - for gas off.

In [7]:
# data.setFilter()
data.setFilter({'gas off':{'evrs':[0,0,70]}}, reset = True)  # , 'gas off':{'evrs':[0,0,70]}})
data.filterData()
data.corrRun()
Set self.ndoverlay, self.hmap and self.layout for dim=2.
Set self.ndoverlay, self.hmap and self.layout for dim=['intensities', 'eShot'].
In [8]:
data.hmap
Out[8]:

Covariance VMI

In the current implementation this will calculate covariances for electron (x,y) data with another data variable, e.g. ion intensities, on a per-shot basis. The routine uses numpy.corrcoef (https://numpy.org/devdocs/reference/generated/numpy.corrcoef.html) to calculate correlations for each (x,y) pixel with each other data dimension set (e.g. multiple ion species). It's quite slow, but amenable to massive parallelisation with some effort.

To make the code simpler, the data is first converted to a sparse array (using the Sparse library); standard 3D array referencing is then possible with a minimal memory footprint, and it also gives a nice array summary output.
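The per-pixel loop described above reduces to repeated numpy.corrcoef calls; a minimal sketch with toy data (dense here for brevity, standing in for the sparse (x, y, shot) stack):

```python
import numpy as np

# Toy stand-ins: an (x, y, shot) electron image stack and a per-shot ion intensity.
rng = np.random.default_rng(0)
nx, ny, nShots = 4, 4, 200
images = rng.poisson(1.0, size=(nx, ny, nShots)).astype(float)
ionIntensity = rng.normal(size=nShots)

# Correlate each pixel's per-shot trace with the ion trace -> one covariance image.
corrImg = np.empty((nx, ny))
for i in range(nx):
    for j in range(ny):
        corrImg[i, j] = np.corrcoef(images[i, j, :], ionIntensity)[0, 1]
```

Pixels with zero variance across shots would produce NaNs here, which is the source of the true_divide RuntimeWarnings seen in the real outputs.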

Generate

In [9]:
# Set metrics 
# data.runMetrics()
In [10]:
# For testing, set filter first...
# Simple filter - set for gas on only
# TODO: implement multiple filter sets here.
data.setFilter(reset = True)
data.setFilter({'unsat':{'evrs':[1,1,70], 'eShot':[0,2000,0], 'desc':'Unsaturated signal, electron counts < 2000.'}})  #'gas on':{'evrs':[1,1,70]}, 'gas off':{'evrs':[0,0,70]}})
# Gas on/off, set event code #70. Unsaturated signal, check electron counts < 2000.
data.filterData()
In [11]:
data.filters
Out[11]:
{'unsat': {'evrs': [1, 1, 70],
  'eShot': [0, 2000, 0],
  'desc': 'Unsaturated signal, electron counts < 2000.'}}
In [12]:
# Generate covariance VMI images.
# This takes a while (~30 mins for the test dataset) with the current implementation, which loops over pixels and correlated datasets (default is 'intensities' data).
# Note that the default includes x4 downsampling on the (x,y) pixels.
data.genCorrVMIXmulti(saveSparse=True, downsampleRatios = [2,2,1])
Generating covariance VMI images for filters: unsat
Generating sparse data for dataset 81
Set sparse array, dims = ['xc', 'yc', 'shot']
Format: coo
Data Type: int64
Shape: (1020, 1020, 73658)
nnz: 17076586
Density: 0.000222833654909523
Read-only: True
Size: 521.1M
Storage ratio: 0.0
Generating covariance VMI images for dataset 81
Generating covariance VMI images for covar data intensities, col=0 of 5
/cds/home/p/phockett/.conda/envs/ps-4.0.8-local-clone/lib/python3.7/site-packages/numpy/lib/function_base.py:2559: RuntimeWarning: invalid value encountered in true_divide
  c /= stddev[:, None]
/cds/home/p/phockett/.conda/envs/ps-4.0.8-local-clone/lib/python3.7/site-packages/numpy/lib/function_base.py:2560: RuntimeWarning: invalid value encountered in true_divide
  c /= stddev[None, :]
Generating covariance VMI images for covar data intensities, col=1 of 5
Generating covariance VMI images for covar data intensities, col=2 of 5
Generating covariance VMI images for covar data intensities, col=3 of 5
Generating covariance VMI images for covar data intensities, col=4 of 5
Generating sparse data for dataset 82
Set sparse array, dims = ['xc', 'yc', 'shot']
Format: coo
Data Type: int64
Shape: (1020, 1020, 75402)
nnz: 22457412
Density: 0.000286270434760342
Read-only: True
Size: 685.3M
Storage ratio: 0.0
Generating covariance VMI images for dataset 82
Generating covariance VMI images for covar data intensities, col=0 of 5
Generating covariance VMI images for covar data intensities, col=1 of 5
Generating covariance VMI images for covar data intensities, col=2 of 5
Generating covariance VMI images for covar data intensities, col=3 of 5
Generating covariance VMI images for covar data intensities, col=4 of 5
Generating sparse data for dataset 83
Set sparse array, dims = ['xc', 'yc', 'shot']
Format: coo
Data Type: int64
Shape: (1020, 1020, 72980)
nnz: 22947146
Density: 0.0003022208872802153
Read-only: True
Size: 700.3M
Storage ratio: 0.0
Generating covariance VMI images for dataset 83
Generating covariance VMI images for covar data intensities, col=0 of 5
Generating covariance VMI images for covar data intensities, col=1 of 5
Generating covariance VMI images for covar data intensities, col=2 of 5
Generating covariance VMI images for covar data intensities, col=3 of 5
Generating covariance VMI images for covar data intensities, col=4 of 5
Generating sparse data for dataset 84
Set sparse array, dims = ['xc', 'yc', 'shot']
Format: coo
Data Type: int64
Shape: (1020, 1020, 74351)
nnz: 27728206
Density: 0.00035845497662352616
Read-only: True
Size: 846.2M
Storage ratio: 0.0
Generating covariance VMI images for dataset 84
Generating covariance VMI images for covar data intensities, col=0 of 5
Generating covariance VMI images for covar data intensities, col=1 of 5
Generating covariance VMI images for covar data intensities, col=2 of 5
Generating covariance VMI images for covar data intensities, col=3 of 5
Generating covariance VMI images for covar data intensities, col=4 of 5
Generating sparse data for dataset 85
Set sparse array, dims = ['xc', 'yc', 'shot']
Format: coo
Data Type: int64
Shape: (1020, 1020, 73406)
nnz: 33486672
Density: 0.0004384702028983485
Read-only: True
Size: 1021.9M
Storage ratio: 0.0
Generating covariance VMI images for dataset 85
Generating covariance VMI images for covar data intensities, col=0 of 5
Generating covariance VMI images for covar data intensities, col=1 of 5
Generating covariance VMI images for covar data intensities, col=2 of 5
Generating covariance VMI images for covar data intensities, col=3 of 5
Generating covariance VMI images for covar data intensities, col=4 of 5
Generating sparse data for dataset 86
Set sparse array, dims = ['xc', 'yc', 'shot']
Format: coo
Data Type: int64
Shape: (1020, 1020, 92623)
nnz: 47847925
Density: 0.000496528203113876
Read-only: True
Size: 1.4G
Storage ratio: 0.0
Generating covariance VMI images for dataset 86
Generating covariance VMI images for covar data intensities, col=0 of 5
Generating covariance VMI images for covar data intensities, col=1 of 5
Generating covariance VMI images for covar data intensities, col=2 of 5
Generating covariance VMI images for covar data intensities, col=3 of 5
Generating covariance VMI images for covar data intensities, col=4 of 5
Generating sparse data for dataset 87
Set sparse array, dims = ['xc', 'yc', 'shot']
Format: coo
Data Type: int64
Shape: (1020, 1020, 79525)
nnz: 58751882
Density: 0.0007100971369679715
Read-only: True
Size: 1.8G
Storage ratio: 0.0
Generating covariance VMI images for dataset 87
Generating covariance VMI images for covar data intensities, col=0 of 5
Generating covariance VMI images for covar data intensities, col=1 of 5
Generating covariance VMI images for covar data intensities, col=2 of 5
Generating covariance VMI images for covar data intensities, col=3 of 5
Generating covariance VMI images for covar data intensities, col=4 of 5
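The downsampleRatios = [2,2,1] above reduce the (x,y) pixel count by 2 in each dimension (x4 fewer pixels overall). As a hedged sketch of one way such binning can be done (not necessarily the routine's implementation):

```python
import numpy as np

# 2x2 binning of a toy image: sum each non-overlapping 2x2 block.
img = np.arange(16, dtype=float).reshape(4, 4)
binned = img.reshape(2, 2, 2, 2).sum(axis=(1, 3))

print(binned.shape)  # (2, 2)
```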

*** TESTING seems OK, but filters not working so far? Could be a super() issue? Or overwriting data accidentally?

In [13]:
# Show images
data.showImgSet(dims = ['xc','yc'])
In [14]:
data.imgStack
Out[14]:
<xarray.Dataset>
Dimensions:      (intensities: 5, run: 7, xc: 510, yc: 510)
Coordinates:
  * run          (run) int64 81 82 83 84 85 86 87
  * xc           (xc) int64 0 1 2 3 4 5 6 7 ... 502 503 504 505 506 507 508 509
  * yc           (yc) int64 0 1 2 3 4 5 6 7 ... 502 503 504 505 506 507 508 509
  * intensities  (intensities) int64 0 1 2 3 4
    norm         (run) int64 63130 64625 62557 63722 62924 79393 68166
Data variables:
    unsat        (run, xc, yc, intensities) float64 nan nan ... -4.906e-08
In [77]:
# Quick save...
# data.corrStack.to_netcdf("LW06_corrStack_test_040121.nc")  # Fails on attribs
data.corrStackCP = data.corrStack.copy()  
data.corrStackCP['unsat'].attrs = {}
data.corrStackCP.to_netcdf("LW06_corrStack_test_040121.nc")  # Runs... 72Mb at 2,2 resample
# ds_disk = xr.open_dataset("LW06_corrStack_test_040121.nc") to load
In [78]:
!ls -ll
total 222132
-rw-r--r-- 1 phockett xu       24 Nov 30 12:56 JR-runner-test-301120.txt
-rw-r--r-- 1 phockett xu   571779 Nov 30 10:26 JR-test_301120_1.html
-rw-r--r-- 1 phockett xu     1429 Nov 30 10:26 JR-test_301120.ipynb
-rw-r--r-- 1 phockett xu  6546283 Dec 10 14:26 LW06_correlations_tests_101220.ipynb
-rw-r--r-- 1 phockett xu 14213513 Jan  3 09:31 LW06_correlations_tests_291220_invTesting_dev.ipynb
-rw-r--r-- 1 phockett xu 10279570 Dec 31 11:24 LW06_correlations_tests_291220_invTesting.ipynb
-rw-r--r-- 1 phockett xu 13455291 Jan  4 08:24 LW06_correlations_tests_291220.ipynb
-rw-r--r-- 1 phockett xu 72846201 Jan  4 08:52 LW06_corrStack_test_040121.nc
-rw-r--r-- 1 phockett xu    85433 Nov 30 17:48 LW06_runs89-97_IvE_datashaderTest_301120.png
-rw------- 1 phockett xu     6759 Nov 30 10:47 nohup.out.r81-87FAIL
-rw------- 1 phockett xu     7902 Nov 30 11:18 nohup.out.r81-87_nbconvFAIL
-rw-r--r-- 1 phockett xu   169057 Dec  1 10:52 TDD_128-137_proc_221120.png
-rw-r--r-- 1 phockett xu  6260479 Nov 27 14:11 VMIproc_LW06_runs128-137_271120_v1.ipynb
-rw-r--r-- 1 phockett xu    14096 Dec  2 09:38 VMIproc_LW06_runs128-140_021220_clean.ipynb
-rw-r--r-- 1 phockett xu  3199902 Dec  2 10:42 VMIproc_LW06_runs128-140_021220.ipynb
-rw-r--r-- 1 phockett xu 21234849 Dec  1 18:27 VMIproc_LW06_runs81-87_271120.ipynb
-rw-r--r-- 1 phockett xu 19968245 Nov 30 13:21 VMIproc_LW06_runs81-87_271120_JRtest2_301120_1.html
-rw-r--r-- 1 phockett xu    14070 Nov 30 12:55 VMIproc_LW06_runs81-87_271120_JRtest2_301120.ipynb
-rw-r--r-- 1 phockett xu    13764 Nov 30 11:04 VMIproc_LW06_runs81-87_271120_JRtest_301120.ipynb
-rw-r--r-- 1 phockett xu 19966244 Nov 30 12:00 VMIproc_LW06_runs81-87_271120_nbconv.html
-rw-r--r-- 1 phockett xu    13504 Nov 30 11:34 VMIproc_LW06_runs81-87_271120_nbconv.ipynb
-rw-r--r-- 1 phockett xu  4991149 Dec  1 17:30 VMIproc_LW06_runs81-87_271120_SCRATCH.ipynb
-rw-r--r-- 1 phockett xu  5883570 Nov 27 15:38 VMIproc_LW06_runs81-87_271120_v1.ipynb
-rw-r--r-- 1 phockett xu  3199120 Dec  2 09:26 VMIproc_LW06_runs81-87_271120_v2_clear.ipynb
-rw-r--r-- 1 phockett xu    14334 Dec  1 19:00 VMIproc_LW06_runs81-87_271120_v2_clear-JR.ipynb
-rw-r--r-- 1 phockett xu  3205720 Dec  3 14:32 VMIproc_LW06_runs81-87_271120_v2_DEV.ipynb
-rw-r--r-- 1 phockett xu 21240095 Dec  3 15:29 VMIproc_LW06_runs81-87_271120_v3.ipynb

Invert

See the VMIproc class guide for details.

In [15]:
# For cpbasex, set the basis path
basisPath = r'/reg/d/psdm/tmo/tmolw0618/scratch/results/tmo_ana/calc/G_r512_k128_l4.h5'

# Set also local version - the pip-installed version was not working with loadG in testing (version mismatch, or something in the import routine?)
pbasexPath = r'/reg/d/psdm/tmo/tmolw0618/results/modules/pbasex' #/pbasex'

# Import module + basis functions
data.setMethod(pbasexPath = pbasexPath, basisPath = basisPath)
cpBasex import OK, checking for basis functions...
Found basis at /reg/d/psdm/tmo/tmolw0618/scratch/results/tmo_ana/calc/G_r512_k128_l4.h5.
In [118]:
# Q: will this work for data with additional dims...
filterSet = 'unsat'

# Invert a dataset
# data.setCentre(dims = ['xc','yc'], filterSet='gas on')  # This sets [137, 5] with test data, checks wrong dim...?
# data.setCentre([137, 131], dims = ['xc','yc'], filterSet=filterSet)  # 4,4 downsample
# data.setCentre([276, 274], dims = ['xc','yc'], filterSet=filterSet)  # 2,2 downsample
data.setCentre([277, 275], filterSet=filterSet) # , dims = ['xc','yc'], filterSet=filterSet)  # 2,2 downsample
# data.setCentre(filterSet=filterSet)
# data.checkCentre(dims = ['xc','yc'], name='gas on')  # Currently not working? NOW FIXED below with imgStack replacement
data.centreInds
Out[118]:
{'input': [277, 275],
 'dims': ['yc', 'xc'],
 'yc': {'coord': array(277), 'index': array([277])},
 'xc': {'coord': array(275), 'index': array([275])},
 'cList': [277, 275],
 'dMax': [232, 234]}
In [119]:
# data.showImg(name=filterSet, dims = ['xc','yc'])
data.imgStack = data.corrStack[filterSet].sel({'intensities':3}).to_dataset()  # Set lower-dim data to imgStack for testing - should add dims into the main routine
# test = data.showImg(name=filterSet, dims = ['xc','yc'], returnImg = True) # Testing direct img return - OK
In [120]:
data.checkCentre(name=filterSet) #, dims = ['yc','xc']) #, name=filterSet)  # Currently not working? - FIXED if imgStack of reduced dims set above.
In [24]:
# data.inv(filterSet=filterSet)  # Issue with dim stacking here - now have one too many!

# # For rapid plotting per run, use self.plotSpectra()
# data.plotSpectra()
In [ ]:
# [dim for dim in data.imgStack.dims if dim not in ['xc', 'yc', 'run']]
# data.imgStack.var
# list(data.imgStack.groupby(corrDims[0]))
In [137]:
# Try wrapping inv routine for correlation dim

# Set corrStack to avoid overwrite - now set in main routine
# data.corrStack = data.imgStack.copy()
# data.corrStackBK = data.imgStack.copy()

filterSet = 'unsat'

# Check dims
corrDims = [dim for dim in data.corrStack.dims if dim not in ['xc', 'yc', 'run']]

# if len(corrDims) > 1:
#     print()

# for item in data.corrStack[corrDims]:
#     print(item)

# data.imgStack.groupby(corrDims[0]).map(data.inv())  # This could work, but will need some wrapping

data.proc = {}  # MAY ALREADY EXIST, for testing just clear
data.proc['corr'] = {}

# HACK - groupby and process each correlation set separately. Will then need to restack results.
for item in data.corrStack.groupby(corrDims[0]):
    print(item)
    data.imgStack = item[1].transpose('xc','yc',...)  # Force dim ordering for inversion routine - should back-propagate this! NOTE ALSO DIM ORDERING convention, ugh - should pull from centreInds for consistency.
#     data.imgStack = item[1].transpose('yc','xc',...)
    
    # Upsample image - with small data (255 px) getting some weird issues... might be due to radial masking?
    # FULL MANUAL!!! As per http://xarray.pydata.org/en/stable/generated/xarray.Dataset.interp.html#xarray.Dataset.interp
    # Auto-dim based code already written, somewhere...?
#     new_xc = np.linspace(ds.lon[0], ds.lon[-1], ds.dims["lon"] * 4)
    dsRatio = 2
    xcNew = np.linspace(data.imgStack.xc[0], data.imgStack.xc[-1], data.imgStack.dims["xc"] * dsRatio)
    ycNew = np.linspace(data.imgStack.yc[0], data.imgStack.yc[-1], data.imgStack.dims["yc"] * dsRatio)
    data.imgStack = data.imgStack.interp(xc=xcNew, yc=ycNew).fillna(0)   # Note fillna to ensure no NaNs in output image stack.
    
    # Now working OK with upscaling, remember to reset centre coords!
#     data.setCentre([137, 131] * 4, dims = ['xc','yc'], filterSet=filterSet)  
#     data.setCentre([276, 274] * dsRatio, dims = ['xc','yc'], filterSet=filterSet)
#     data.setCentre([277, 275] * dsRatio, dims = ['xc','yc'], filterSet=filterSet)
    data.setCentre([277, 275] * dsRatio, filterSet=filterSet)

    data.inv(filterSet=filterSet)  # This runs, but currently throws errors on coord reassignment.
#     data.inv()
#     data.plotSpectra(filterSet=filterSet)

    data.proc['corr'][item[0]] = data.proc[filterSet].copy()

# Reset to full image stack
# data.imgStack = data.corrStack.copy()
(0, <xarray.Dataset>
Dimensions:      (run: 7, xc: 510, yc: 510)
Coordinates:
  * run          (run) int64 81 82 83 84 85 86 87
  * xc           (xc) int64 0 1 2 3 4 5 6 7 ... 502 503 504 505 506 507 508 509
  * yc           (yc) int64 0 1 2 3 4 5 6 7 ... 502 503 504 505 506 507 508 509
    intensities  int64 0
    norm         (run) int64 63130 64625 62557 63722 62924 79393 68166
Data variables:
    unsat        (run, xc, yc) float64 nan nan nan ... -5.925e-08 6.536e-08)
/reg/d/psdm/tmo/tmolw0618/results/modules/pbasex/pbasex.py:31: RuntimeWarning: invalid value encountered in true_divide
  betas = IEB[:,nim:].reshape(nx,nl-1,nim)/IE[:,None,:]
*** Inversion routine for filterSet = unsat dropping back to default int coords.
*** Inversion routine for filterSet = unsat dropping back to default int coords.
(1, <xarray.Dataset>
Dimensions:      (run: 7, xc: 510, yc: 510)
Coordinates:
  * run          (run) int64 81 82 83 84 85 86 87
  * xc           (xc) int64 0 1 2 3 4 5 6 7 ... 502 503 504 505 506 507 508 509
  * yc           (yc) int64 0 1 2 3 4 5 6 7 ... 502 503 504 505 506 507 508 509
    intensities  int64 1
    norm         (run) int64 63130 64625 62557 63722 62924 79393 68166
Data variables:
    unsat        (run, xc, yc) float64 nan nan nan ... -7.208e-08 -3.952e-09)
*** Inversion routine for filterSet = unsat dropping back to default int coords.
*** Inversion routine for filterSet = unsat dropping back to default int coords.
(2, <xarray.Dataset>
Dimensions:      (run: 7, xc: 510, yc: 510)
Coordinates:
  * run          (run) int64 81 82 83 84 85 86 87
  * xc           (xc) int64 0 1 2 3 4 5 6 7 ... 502 503 504 505 506 507 508 509
  * yc           (yc) int64 0 1 2 3 4 5 6 7 ... 502 503 504 505 506 507 508 509
    intensities  int64 2
    norm         (run) int64 63130 64625 62557 63722 62924 79393 68166
Data variables:
    unsat        (run, xc, yc) float64 nan nan nan ... 1.058e-08 1.165e-08)
*** Inversion routine for filterSet = unsat dropping back to default int coords.
*** Inversion routine for filterSet = unsat dropping back to default int coords.
(3, <xarray.Dataset>
Dimensions:      (run: 7, xc: 510, yc: 510)
Coordinates:
  * run          (run) int64 81 82 83 84 85 86 87
  * xc           (xc) int64 0 1 2 3 4 5 6 7 ... 502 503 504 505 506 507 508 509
  * yc           (yc) int64 0 1 2 3 4 5 6 7 ... 502 503 504 505 506 507 508 509
    intensities  int64 3
    norm         (run) int64 63130 64625 62557 63722 62924 79393 68166
Data variables:
    unsat        (run, xc, yc) float64 nan nan nan ... 8.417e-08 -1.362e-09)
*** Inversion routine for filterSet = unsat dropping back to default int coords.
*** Inversion routine for filterSet = unsat dropping back to default int coords.
(4, <xarray.Dataset>
Dimensions:      (run: 7, xc: 510, yc: 510)
Coordinates:
  * run          (run) int64 81 82 83 84 85 86 87
  * xc           (xc) int64 0 1 2 3 4 5 6 7 ... 502 503 504 505 506 507 508 509
  * yc           (yc) int64 0 1 2 3 4 5 6 7 ... 502 503 504 505 506 507 508 509
    intensities  int64 4
    norm         (run) int64 63130 64625 62557 63722 62924 79393 68166
Data variables:
    unsat        (run, xc, yc) float64 nan nan nan ... 1.937e-08 -4.906e-08)
*** Inversion routine for filterSet = unsat dropping back to default int coords.
*** Inversion routine for filterSet = unsat dropping back to default int coords.
In [138]:
# Restack
# TODO: add for fold, fit and inv too.
xrSet = [data.proc['corr'][k]['xr'].expand_dims({'intensities':[k]}) for k in data.proc['corr'].keys()]

temp = xr.concat(xrSet, 'intensities')
data.proc[filterSet]['xr'] = temp  # .to_dataset()  # temp.rename('CorrXR').to_dataset()

# data.proc[filterSet]['xr'].name = 'CorrXR'
# data.proc[filterSet]['xr'] = data.proc[filterSet]['xr'].to_dataset()
# data.proc[filterSet]['xr'] = xr.combine_nested(xrSet, 'intensities') 
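The restack above follows a general xarray pattern: tag each per-set result with expand_dims, then concat along the new dimension. A minimal sketch with hypothetical arrays standing in for data.proc['corr'][k]['xr']:

```python
import numpy as np
import xarray as xr

# Hypothetical per-set inversion results, one array per correlation set.
results = {k: xr.DataArray(np.full((3, 3), float(k)), dims=['xc', 'yc'])
           for k in range(5)}

# Tag each result with its set index, then restack along 'intensities'.
tagged = [results[k].expand_dims({'intensities': [k]}) for k in results]
restacked = xr.concat(tagged, 'intensities')  # dims: intensities, xc, yc
```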
In [132]:
data.imgStack = data.corrStack.copy()
data.showImgSet(dims = ['yc','xc'])
In [139]:
# data.proc[filterSet]
data.setRmask(filterSet=filterSet, XSmin=1e-6, rPix = [0,5], eRange = [1,45])  # Without mask reset this is odd... and also after mask reset - not correctly using all dims?
# ISSUE seems to be with XS dim, but not clear why. Actually looks OK with lower threshold - unnorm values?
hmap = data.plotSpectra(filterSet=filterSet, useMask = True, returnMap = True)   # Throws dim errors...  NOW WORKING AFTER FIXING DATASTRUCTURE!

hmap
# TODO: fix ePlot slicing, assumes dims at the moment. #, ePlot=[1, 45])
Out[139]:
In [30]:
# # Check routine...
# eSpecDS = hv.Dataset(data.proc[filterSet]['xr']) #.rename('CorrXR').to_dataset()) # ['corrXR'])  # As dataarray get blank dataset returned
# hmap = eSpecDS.to(hv.Curve, kdims=['E'])
# # hmap = eSpecDS.to(hv.Curve, kdims=['E'], vdims=['BLM'])
# # hmap = eSpecDS.to(hv.Curve, kdims=['E'], vdims=['BLM'])
# # hmap = eSpecDS.to(hv.HoloMap, kdims=['E'])

# # TODO - remember how to do this properly!!! May need to stack as dict for multidim case?
# # AH - issue was actually coord assignments above. Now working

# # Try manual stacking, sigh
# # curveDict = {}
# # for run in eSpecDS['run'].data:
# #     curveDict.update({(BLM, run):hv.Curve(eSpecDS.select(BLM=BLM, run = run), kdims=['E']) for BLM in eSpecDS['BLM'].data})

# # hmap = hv.HoloMap(curveDict, kdims=['BLM','run'])

# # hmap
In [140]:
hmap.overlay('BLM').layout('run').cols(1)
Out[140]:
In [141]:
hmap.overlay('run').layout('BLM').cols(1)
Out[141]:
In [ ]:
eSpecDS
In [ ]:
data.imgStack['gas on'].sel(run=81).plot.imshow()
In [ ]:
imgIn = data.imgStack #['gas on'] # .data
# imgIn = data.restackVMIdataset(reduce=False)
dimStack = imgIn.dims
# imgIn.transpose(*dimStack[2:],*dimStack[:2])  # OK for da
imgIn.transpose('xc','yc',...)
In [ ]:
dims = ['xc','yc']
step = [4,4]
# resampleDims = {d:s for (d,s) in zip(dims,step)}
# imgIn.resample({d:s for (d,s) in zip(dims,step)}) #, boundary="trim")  # Only single dim for resample
# imgIn.resample(resampleDims.pop())
# imgIn.resample(xc=2).interpolate("linear")  # invalid freq error???

xcNew = np.linspace(imgIn.xc[0], imgIn.xc[-1], imgIn.dims["xc"] * 4)
imgIn.interp(xc=xcNew)
In [ ]:
{d:s for (d,s) in zip(dims,step)}
In [ ]:
resampleDims.pop()
In [41]:
# data.proc[filterSet]  #['fold']
In [ ]:
dims = list(data.imgStack.dims)[-1:0:-1]
dims

# data.imgStack[dims[0]][np.arange(0, data.proc['gas on']['fold'].shape[0])]
try:
    data.imgStack[dims[0]][np.arange(0, data.proc['gas on']['pbasex']['inv'].shape[0])]

except IndexError:
    np.arange(0, data.proc['gas on']['pbasex']['inv'].shape[0])
    
# try:
#     b0 = self.imgStack[dims[0]][np.arange(0,data.shape[0])]
#     b1 = self.imgStack[dims[1]][np.arange(0,data.shape[1])]

# # Default to int coords, this allows for cases where sizes don't match (if resized for instance)
# except IndexError:
#     b0 = np.arange(0,data.shape[0])
#     b1 = np.arange(0,data.shape[1])
In [ ]:
data.proc['gas on']['pbasex']['fit'].shape[0]
In [ ]:
data.proc['gas on']['fold'].shape
In [54]:
data.proc['corr'][0].keys()
Out[54]:
dict_keys(['fold', 'pbasex', 'xr'])
In [83]:
import matplotlib.pyplot as plt

# Check filterset
# plt.imshow(data.proc[filterSet]['fold'][:,:,6])
# plt.imshow(data.proc[filterSet]['pbasex']['fit'][:,:,0])

# Check corr
corrSet = 3
# plt.imshow(data.proc['corr'][corrSet]['fold'][:,:,0])
plt.imshow(data.proc['corr'][corrSet]['pbasex']['fit'][:,:,2])
plt.show()
plt.imshow(data.proc['corr'][corrSet-1]['pbasex']['fit'][:,:,2])
Out[83]:
<matplotlib.image.AxesImage at 0x7f2550fe1290>
In [ ]:
fold = data.proc['gas on']['fold'][:,:,0]
out = data.cp.pbasex(fold, data.gBasis, alpha=3.59e-4, make_images=True)
out
In [84]:
data.proc['gas on']['pbasex']['inv'][:,:,0].shape
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-84-b2f1d11c0c64> in <module>
----> 1 data.proc['gas on']['pbasex']['inv'][:,:,0].shape

KeyError: 'gas on'
In [ ]:
data.corrStackBK = data.corrStack.copy()
In [ ]:
# Check restacking...

# inv routine runs on either:
# self.restackVMIdataset(reduce=False)
# self.imgStack[filterSet].data

# data.imgStack['gas on']
data.restackVMIdataset(reduce=False)

Versions

In [32]:
# tmo-dev class version
data.__version__
Out[32]:
'0.0.1'
In [33]:
import scooby
scooby.Report(additional=['holoviews', 'xarray', 'sparse'])
Out[33]:
Mon Jan 04 07:08:49 2021 PST
OS: Linux | CPU(s): 12 | Machine: x86_64 | Architecture: 64bit | RAM: 125.7 GB | Environment: Jupyter
Python 3.7.8 | packaged by conda-forge | (default, Jul 31 2020, 02:25:08) [GCC 7.5.0]
holoviews: 1.13.5 | xarray: 0.16.1 | sparse: 0.11.2 | numpy: 1.19.4 | scipy: 1.5.3 | IPython: 7.19.0 | matplotlib: 3.3.2 | scooby: 0.5.6